Americans under 50, as the cliche goes, were raised by the mass media. And this fall, as grown children sometimes do, some of them began to neglect their mother. On the major broadcast TV networks, ratings among viewers 18 to 49 years old (the group most closely watched by advertisers) were down 8%. The drop-off was even worse among men under 35, the couch potatoes of the future. The rejection was almost poignant. You don't call? You don't write? It would kill you to pick up a remote?
The networks, which stood to lose hundreds of millions of ad dollars, blamed the Nielsen rating service. Advertisers blamed the programming. But the real blame belongs to a historical force more powerful than a Nielsen box, more pernicious than a stack of bad Coupling scripts and not limited to TV: the end, or at least the extreme makeover, of the mass-media audience as we have known it.
For more than two decades the networks have competed with cable. Now they also vie with home video, computer games and the Paris Hilton sex tape on the Internet. The old three-network system swore by L.O.P., least objectionable programming. Now sizable chunks of the audience, especially young viewers, demand most objectionable programming: unusual, gross, risque. If you don't give it to them, they'll watch Punk'd or play Manhunt instead. If you do, you may lose your other viewers to HGTV or Lifetime. In the most mass of mass media, it is no longer possible to please most of the people most of the time.
But this is not only TV's curse. (Or blessing? More on that later.) In all of entertainment we are moving from the era of mass culture to the era of individual culture. Ask the music-biz professionals, if you can talk them off the ledges outside their offices. Album sales were down more than 5% from 2002's already dismal results, thanks largely to illegal music downloading. Legitimate online sellers like iTunes threaten to kill the album, the format that made entertainers into auteurs in the rock era, and to usher in the era of every man his own mix master. The movie industry has not been as badly hit by piracy yet, but it went through a summer of surefire hits (Hulk, the Tomb Raider and Charlie's Angels sequels) that weren't. What's saving that business is DVDs, now a greater source of revenue than the box office, whose appeal is that, by offering special features, extra scenes and alternative camera angles and endings, they allow everyone to watch the same movie differently and separately. (Cannily, the apocalyptic chiller 28 Days Later was released in theaters with two endings: a made-for-theaters DVD.) The New York Times Magazine recently heralded theater for one: mini-plays designed to be seen by one person at a time.
There are two stories here, a business one and a cultural one. The business one should not deeply interest you unless you were hoping a Hollywood mogul would buy you a Hummer for Christmas this year. But the cultural story is about all of us: the Whitmanian, immigrant America of contradictory multitudes. Americans do not have a shared ethnic past or state religion. We have Jessica Simpson. Once, when tens of millions of people listened to the same summer hits, watched the same sitcoms and cried together in movie houses, the mass media defined what mainstream meant: what ideals we valued, how much change we would tolerate. If it's harder and harder to define mainstream pop culture, is there a mainstream at all?
Of course, no sooner had the printing press been invented than some pundit was probably bemoaning how people, individually consuming those newfangled "books," would lose the community spirit engendered by Passion plays and witch burnings. And it's worth remembering that mass culture was a 20th century anomaly. Before film and broadcasting, the idea of a giant country, much less the world, sharing a common culture was ludicrous. Travel 100 miles or so, and you'd encounter different dialects, values and folkways. Even religion could spread only so far before being locally amended by, say, a king needing a quickie divorce. Mass culture flattened out dialects and provided new Americans with a quick if superficial means of assimilation. But it developed only because the technology for mass communication was invented before the technology for mass choice. In the late 1940s some 80% of TVs tuned in to Texaco Star Theater because, yes, Milton Berle was funny, but in part too because not much else was on.
But if mass media was a technological accident, it was also an idea, in synch with other ideas of its time. It was part of the mid-20th century age of bigness, centralization and consolidation: Big Government, the draft, central cities, UNIVACs, lifetime employment and evil empires you could find on a map. And its decline is in synch with a world that is increasingly decentralized, atomized and a la carte: tax revolts, the volunteer "Army of One," suburbs, the Web, job hopping and stateless terrorism.
As the war in Iraq showed, social and cultural fragmentation can mirror and even abet each other. Normally you can count on war to bring a country together, as happened for a while after 9/11. But Iraq quickly found the U.S. divided, both within itself (Michael Moore at the Oscars and the Dixie Chicks vs. Toby Keith) and against much of the rest of the globe. There was a corresponding theme of us, or rather U.S., against the world in 2003's pop culture. Overseas artists critiqued America for the way it reacted to 9/11 (in the short-film anthology 11'09"01 and at "The American Effect" at the Whitney) and for its pop-culture excesses (in the London opera Jerry Springer). Joe Millionaire 2 featured a fresh-scrubbed cowboy from Texas romancing 14 worldly European bachelorettes under the pretense that he was a multimillionaire, a devilish if inadvertent satire of U.S.-Europe relations, playing off each side's worst stereotypes of the other (the lying cowboy vs. corrupt, chain-smoking Old Worlders). Maybe the most plangent treatment of American isolation was Sofia Coppola's Lost in Translation, with Bill Murray and Scarlett Johansson as Americans in a Tokyo so alien, it might as well have been Neptune.
And in Iraq, unlike Vietnam, there was no Walter Cronkite to speak for the great middle. Ratings for cable news shot up, while big-network newscasts stayed level or even dropped. Some viewers' media choices became a kind of political secret handshake. Pro-war, you watched Fox News, learned that the war was a rout and disdained the liberal big media. Antiwar, you watched BBC News or al-Jazeera on satellite, learned that the war was a quagmire and disdained the jingoistic big media. Pox on both your houses, you watched Jon Stewart.
Or you voted none of the above. What network did the most people watch the night the ground war began? NBC. While ABC and the Fox network went with war news, the Peacock had the sense, bravery and civic responsibility to air ... Friends.
In an overentertained, overmediated society, mainstream culture becomes more and more a secondhand experience. We are less influenced by books, movies, CDs and plays (who has the time?) than by what we hear about them through the media. Queer Eye for the Straight Guy, for instance, helped prompt a national seminar on gay-straight relations even though only a couple of million of us actually watched any given episode. Only so many people were technologically intrepid enough to track down the Hilton video online, but the so-called scandal (which was what, exactly? that a woman had sex with her boyfriend?) helped draw millions to her reality show, The Simple Life. We may not have watched the MTV Video Music Awards, but we all knew about Britney tongue-wrestling Madonna.
Amid all this media-generated controversy, it could be difficult for a creative work itself to stir up the culture. In 1994 Quentin Tarantino's Pulp Fiction generated volumes of discussion about movie violence. In 2003 Kill Bill Vol. 1, which made Pulp look like Toy Story, landed nearly as softly as villainess Lucy Liu did when she collapsed bloodily into the snow in its climax. Dan Brown's The Da Vinci Code, involving a theory that Mary Magdalene may have been Jesus' wife and the mother of his child, intrigued readers and sold millions of copies, but it was ABC News that really took religious fire when it raised the same question in a prime-time special. In fact, it was easier for a work to provoke discussion if no one saw it. Possibly the most debated works of 2003 were The Passion of the Christ, Mel Gibson's unfinished movie about the Crucifixion; The Reagans, a TV biopic that no one outside CBS saw before the network canceled it under protest; and Daniel Libeskind's World Trade Center rebuilding design, which spent most of the year on the redrawing board.
This dichotomy between the buzz culture and the culture we actually consume also created two kinds of celebrities: those we wanted to see on the screen or hear on the radio and those we just wanted to read about in Us or PEOPLE. Occasionally, the categories overlapped, as with Beyonce, who conquered the news racks and the CD racks. But in other cases, notably Ben and Jen and Gigli, fame and commercial fortune were, if anything, inversely proportional. And whereas 2002 gave us famous has-beens, like Ozzy Osbourne and Anna Nicole, 2003 was the year of famous never-weres. Ally Hilfiger and Jamie Gleicher of MTV's Rich Girls, for instance, seem to have been created out of thin air so we could envy and sneer at them at once.
Notoriety still paid in 2003, to an extent. Rapper 50 Cent parlayed a tabloid-lurid story (he has been shot, he claims, nine times) into the year's top-selling album. And Demi Moore helped her celebrity profile by hooking up with Ashton Kutcher (more, probably, than she helped her summer flick, Charlie's Angels: Full Throttle). But whom did we actually want to see in a movie?
We'll have the fish, please.
Finding Nemo was the kind of exception that proves that just when you're ready to declare the mainstream dead, it swims up and bites you on the tush. The year's top-grossing movie was also an example of just what it takes, in a culture broken down by tribes and ages and demographics, to make an across-the-board hit. People flocked to Nemo because it was a good movie, of course. It was moving, it was beautifully animated. And who doesn't like a good ink-spurting joke? But more important, it was about easy-to-agree-with universals: loving your family, learning to live with risks. (It was the sort of movie that, before the statute of limitations expired, we would have called "post-Sept. 11.") And it had a cast whose appeal was not laser-targeted toward young urban males or moms over 40. Black or white, young or old, liberal or conservative, we all feel pretty much the same about fish, except that some of us don't like tartar sauce.
Other times, though, you don't immediately recognize the voice of the mainstream even when it shows up on your TV and belts out Mack the Knife. Clay Aiken, the skinny, geeky American Idol runner-up who was the year's surprise recording star, was, you might say, so mainstream that he was weird: a straitlaced, smiling, asexual whippet who loved to sing standards. Idol's judges, and record-company execs, doubted that Clay could make it in the pop market of 2003. One multiplatinum CD later (Measure of a Man, so pure it floats), he proved them wrong and showed that in some ways the mainstream is now itself a niche.
Aiken's sales, by the way, outstripped those of Ruben Studdard, the moon-faced R.-and-B. crooner who won Idol. So who speaks for the mainstream, the TV audience voting with its phones or the music audience voting with its wallets? Is a thing mainstream only to the extent that we're willing to pay money for it?
Well, that doesn't hurt; this is America. Aikenmania also showed how the culture is increasingly in the hands of nontraditional commercial tastemakers like Wal-Mart. Measure was sold largely to kids and parents in checkout lines, people who might never set foot inside a record store. With almost 3,000 locations in the U.S., Wal-Mart is more of a broadcaster than NBC is. And it's using that power culturally, deciding this year, for instance, to exclude racy "lad" magazines like Maxim from its news racks. Big discounters also helped popularize conservative-pundit books and the Veggie Tales Christian videos. Likewise, Queer Eye brought us together through consumerism: gay or straight, it said, we stood united in our need to blow $40 on a bottle of moisturizer. (The show was perhaps the most visible sign of masstige, the fashion world's rather oxymoronic new term for bringing prestige style to the masses: Isaac Mizrahi at Target, for instance.) The Fab Five made gay culture cross over with promises of bourgeois paradise, just as 50 Cent, Jay-Z and many before them brought hip-hop culture over with tales of bling-bling. Big pimpin', meet big primpin'.
But wait a second here. Which one is mainstream? The Fab Five, showing up to make over and showily flirt with an ex-Marine (who whipped up a lovely souffle)? Or Aiken, who cut the patriotic single God Bless the USA with his Idol mates during the war and strenuously purged sex any kind of sex from his music and persona? 50 Cent, who sold more than 6 million copies of Get Rich ("I'm high all the time/I smoke that good s___")? Or Wal-Mart, which carried only the bowdlerized version of his album? The de-religionized spirituality of Mitch Albom's No. 1-selling The Five People You Meet in Heaven? Or the literalistic Christianity of the No. 1-selling thriller Armageddon, from the Left Behind series? Fox News, which carried the flag (in the corner of its screen) for Bush's war? Or the Fox network, which scandalized cultural conservatives with its reality shows and aired The O.C., The Simple Life and Arrested Development, three of 2003's strongest pop-culture jabs at the rich?
In American culture, as in American politics, it was possible to assemble a case for two entirely different visions of the mainstream: one libertine, irreverent and p.c., the other traditional, devout and PG. It's tempting to borrow the electoral blue-state/red-state template and say there are two mainstreams, equal and opposite, but that beggars the definition of mainstream, no? The year 2003, we've heard, was when the swing voter became irrelevant. It could be that our pop culture too no longer has that swing.
Unless (pardon me, Carson Kressley, for the pun) it swings both ways.
The year 2003 introduced a new phrase to the cultural vocabulary: flash mob, an instant gathering of people, organized on the Internet, who receive an e-mail or cell-phone message, show up en masse at a designated spot, perform some absurd act (quack like ducks, bang their shoes on the pavement) and then disperse.
Mainstream culture today is like a flash mob. Those who are part of it know they're part of it, even if it doesn't congregate as often as it did back when 30 million people would watch a network show on a typical night. Every so often, we get the call: we gather for Joe Millionaire or buy that Harry Potter book. Then, show over, book read, we scatter: back to VH1 or our Scarface DVDs or our scrapbooking chat rooms.
Increasingly, the events that most deeply, if briefly, unite that floating mainstream are deaths: Johnny Cash, Bob Hope, Katharine Hepburn. The intensity of response to the passing of John Ritter, a likable actor from a campy '70s sitcom, seemed to surprise even his fans. In a culture with few common cultural referents, the past is what we share the most. (Perhaps for the same reason, 2003's Broadway shows with broad mass appeal tended to be revivals like Long Day's Journey into Night and Wonderful Town, and the music business heaved up a slew of standards albums.) When old stars pass, they take with them a piece of a time when we weren't so niched and subdivided by the market and our own choices. To make the metaphor a little homier, the pop-culture mainstream is a family that used to get together for dinner once a week but now does so only at weddings (or dating-show finales, anyway) and funerals.
But don't mourn those old days. However community-building the old big aggregators were (the three networks, Top 40 radio), they also tended to kill idiosyncrasy (with a few hard-fought exceptions like Cash). That cable serves smaller audiences allowed it this year to produce more polarizing but better TV: FX's Nip/Tuck, ESPN's Playmakers, HBO's Angels in America. (Though, granted, as the debate over the FCC's media-ownership rules noted, most of the open mouths providing those voices are still connected to the corporate lungs of a few giant media companies.) And if iPod users pick and choose singles rather than pay $18 for filler-loaded albums (which were invented more for business than artistic reasons in the first place), it frees them to sample more genres and artists. The trade-off is a flightier, more mercurial and more tabloid pop culture. Its one unifying trait, perhaps, is simply the desire to check out what all the fuss is about. But at least we're still connected enough to care about one another's fusses now and again.
The monolithic mainstream culture of the 20th century helped define what it meant to be American. But it was un-American at heart. The phrase E pluribus unum aside, America was founded on fragmentation by people fleeing religious, political and cultural "community" in the Old World. Nearly 200 years ago, Alexis de Tocqueville wrote that a strength of the new nation was its abundance of space. Here, unlike in Europe, the citizens could be united when they needed to and be alone when they wanted to. In an older, more crowded America, we find that space virtually inside a screen, a book, a set of headphones. This is our last frontier, and it goes on forever.